Tracking probabilistic truths: a logic for statistical learning

Abstract

We propose a new model for forming and revising beliefs about unknown probabilities. To go beyond what is known with certainty and to represent the agent's beliefs about probability, we consider a plausibility map, associating to each possible distribution a plausibility ranking. Beliefs are defined as in Belief Revision Theory, in terms of truth in the most plausible worlds (or, more generally, in all worlds that are plausible enough). We consider two forms of conditioning or belief update, corresponding to the acquisition of two types of information: (1) learning observable evidence obtained by repeated sampling from the unknown distribution; and (2) learning higher-order information about the distribution itself. The first changes only the plausibility map (via a 'plausibilistic' version of Bayes' Rule), but leaves the given set of possible distributions essentially unchanged; the second rules out some distributions, thus shrinking the set of possibilities, without changing their plausibility ordering. We look at the stability of beliefs under either of these types of learning, defining related notions (safe belief and statistical knowledge), as well as a measure of the verisimilitude of a given model. We prove a number of convergence results, showing how our agent's beliefs track the true probability after repeated sampling, and how she eventually gains, in a sense, (statistical) knowledge of that probability. Finally, we sketch the contours of a dynamic doxastic logic for statistical learning.
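The two update operations described in the abstract can be illustrated with a minimal sketch. This is not the authors' formalism, only an assumed toy instantiation: a finite set of candidate Bernoulli biases, each carrying a plausibility weight; sampling evidence reweights plausibilities (the "plausibilistic" Bayes-style update), while higher-order information filters candidates out without reordering the survivors. The names `observe_sample`, `learn_constraint`, and `belief` are hypothetical.

```python
import math

# Candidate biases for an unknown Bernoulli distribution,
# each starting with equal plausibility.
candidates = [0.1, 0.3, 0.5, 0.7, 0.9]
plausibility = {p: 1.0 for p in candidates}

def observe_sample(plaus, outcome):
    """Type-1 update: sampling evidence reweights plausibility by the
    likelihood of the outcome; the set of candidates is unchanged."""
    return {p: w * (p if outcome == 1 else 1 - p) for p, w in plaus.items()}

def learn_constraint(plaus, constraint):
    """Type-2 update: higher-order information rules out candidates,
    shrinking the set without changing the ordering of survivors."""
    return {p: w for p, w in plaus.items() if constraint(p)}

def belief(plaus):
    """Belief is defined by the most plausible candidate(s)."""
    top = max(plaus.values())
    return [p for p, w in plaus.items() if math.isclose(w, top)]

# Repeatedly sample a coin whose true bias is 0.7 (7 heads, 3 tails):
for outcome in [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]:
    plausibility = observe_sample(plausibility, outcome)
print(belief(plausibility))  # belief now tracks the true bias: [0.7]

# Higher-order information: "the coin favours heads" rules out p <= 0.5.
plausibility = learn_constraint(plausibility, lambda p: p > 0.5)
print(belief(plausibility))  # still [0.7]; only the candidate set shrank
```

With enough samples the plausibility weights concentrate on the true bias, which is the intuition behind the paper's convergence results.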


Similar articles

Statistical Inference for Probabilistic Constraint Logic Programming

Most approaches to probabilistic logic programming deal with deduction systems and fixpoint semantics for programming systems with user-specified weights attached to the formulae of the language, i.e., the aim is to connect logical inference and probabilistic inference. However, such a user-specific determination of weights is not reusable and often complex. In various applications, automatic method...


Statistical Unfolded Logic Learning

During the past decade, Statistical Relational Learning (SRL) and Probabilistic Inductive Logic Programming (PILP), owing to their strength in capturing structure information, have attracted much attention for learning relational models such as weighted logic rules. Typically, a generative model is assumed for the structured joint distribution, and the learning process is accomplished in an eno...


Knowledge-Based Probabilistic Logic Learning

Advice giving has long been explored in artificial intelligence as a way to build robust learning algorithms. We consider advice giving in relational domains where the noise is systematic. The advice is provided as logical statements that are then explicitly considered by the learning algorithm at every update. Our empirical evidence shows that human advice can effectively accelerate learning in noisy ...


A Sparse Probabilistic Learning Algorithm for Real-Time Tracking

This paper addresses the problem of applying powerful pattern recognition algorithms based on kernels to efficient visual tracking. Recently Avidan [1] has shown that object recognizers using kernel-SVMs can be elegantly adapted to localization by means of spatial perturbation of the SVM, using optic flow. Whereas Avidan’s SVM applies to each frame of a video independently of other frames, the ...


A Sparse Parameter Learning Method for Probabilistic Logic Programs

We propose a new parameter learning algorithm for ProbLog, an extension of logic programs that can perform probabilistic inference. Our algorithm differs from previous parameter learning algorithms for probabilistic logic program (PLP) models in that it tries to reduce the number of probabilistic parameters contained in the estimated program. Since the amount of computation...



Journal

Journal title: Synthese

Year: 2021

ISSN: 0039-7857, 1573-0964

DOI: https://doi.org/10.1007/s11229-021-03193-6